CS 264: Beyond Worst-Case Analysis Lectures #11 and #12: SDP Algorithms for Semi-Random Bisection and Clique

Author

  • Tim Roughgarden
Abstract

Lectures #9 and #10 studied the planted bisection (a.k.a. community detection) and planted clique models, where we posited specific parameterized input distributions (parameterized by edge densities p and q within and between clusters, or by the planted clique size k), in which a “clearly optimal” solution is planted in an otherwise random graph. The goal was to identify necessary and sufficient conditions on the parameters (p − q, or k) of the distribution such that the planted solution can be recovered in polynomial time. We obtained state-of-the-art positive results for these problems, for p − q = Ω(√(log n / n)) and k = Ω(√n). We obtained these results using spectral algorithms, which compute the second eigenvector of the adjacency matrix followed by some problem-specific postprocessing.

How should we feel about these results? Technically, the results are quite interesting. And in the end, the theory ends up advocating natural algorithms that are not overly tailored to the assumed input distributions, so there is hope that these algorithms could perform well much more generally. Indeed, at least for graph partitioning, spectral algorithms constitute one of the dominant paradigms in practice.

Still, one can’t help but notice that our analysis of these spectral algorithms strongly exploited the assumed input distribution (e.g., through bounds on the eigenvalues of random mean-zero symmetric matrices). Can we do better? That is, can we obtain more robust versions of our recovery results that assume much less about the underlying input distribution? The answer is yes, although we will have to up our game—spectral algorithms will no longer be sufficiently powerful, and we’ll need to rely on semidefinite-programming (SDP)-based algorithms.
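The spectral approach the abstract refers to can be sketched in a few lines: sample a graph from the planted bisection model, then split the vertices by the sign of the second eigenvector of the adjacency matrix. This is a minimal illustration, not the lectures' exact algorithm (which includes problem-specific postprocessing); the function names and parameters (n = 400, p = 0.6, q = 0.2, chosen so that p − q comfortably exceeds the √(log n / n) threshold) are illustrative assumptions.

```python
import numpy as np

def planted_bisection(n, p, q, rng):
    """Sample from the planted bisection model: vertices 0..n/2-1 form one
    side, n/2..n-1 the other; edges appear within a side w.p. p and across
    w.p. q (p > q)."""
    sides = np.array([0] * (n // 2) + [1] * (n // 2))
    same = sides[:, None] == sides[None, :]
    probs = np.where(same, p, q)
    upper = np.triu(rng.random((n, n)) < probs, k=1)  # sample upper triangle
    A = (upper | upper.T).astype(float)               # symmetrize
    return A, sides

def spectral_bisect(A):
    """Partition by the sign of the eigenvector of the second-largest
    eigenvalue of the adjacency matrix."""
    # np.linalg.eigh returns eigenvalues in ascending order, so the
    # second-largest eigenvalue's eigenvector is column -2.
    _, vecs = np.linalg.eigh(A)
    v2 = vecs[:, -2]
    return (v2 > 0).astype(int)

rng = np.random.default_rng(0)
A, truth = planted_bisection(400, 0.6, 0.2, rng)
guess = spectral_bisect(A)
# The planted labels are only recoverable up to swapping the two sides.
agree = max(np.mean(guess == truth), np.mean(guess != truth))
print(f"fraction of vertices classified correctly: {agree:.2f}")
```

With this comfortable separation between p and q, the sign pattern of the second eigenvector should track the planted bisection almost exactly; as p − q shrinks toward the threshold, the recovered partition degrades.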


Similar resources

CS 264: Beyond Worst-Case Analysis Lecture #19: Self-Improving Algorithms

The last several lectures discussed several interpolations between worst-case analysis and average-case analysis designed to identify robust algorithms in the face of strong impossibility results for worst-case guarantees. This lecture gives another analysis framework that blends aspects of worst- and average-case analysis. In today’s model of self-improving algorithms, an adversary picks an inpu...


CS 264: Beyond Worst-Case Analysis Lecture

The last few lectures discussed several interpolations between worst-case and average-case analysis designed to identify robust algorithms in the face of strong impossibility results for worst-case guarantees. This lecture gives another analysis framework that blends aspects of worst- and average-case analysis. In today’s model of self-improving algorithms, an adversary picks an input distributio...


Coping with NP-hardness: Approximating Minimum Bisection and Heuristics for Maximum Clique

Many important optimization problems are known to be NP-hard. That is, unless P = NP, there is no polynomial-time algorithm that optimally solves these problems on every input instance. We study algorithmic ways of coping with NP-hard optimization problems. One possible approach for coping with the NP-hardness is to relax the requirement for an exact solution and devise approximation algorithm...


CS 264: Beyond Worst-Case Analysis

This lecture touches on results of Afshani, Barbay, and Chan [1], who give a number of interesting instance-optimality results for fundamental problems in computational geometry, namely the problems of computing the maximal points or the convex hull of a point set in two or three dimensions. These are perhaps the most compelling examples to date of instance-optimal algorithms when the cost meas...


CS 264: Beyond Worst-Case Analysis Lecture #17: Smoothed Analysis of Local Search

This lecture and the next are about smoothed analysis, which is perhaps the most well-studied “hybrid” analysis framework, blending average-case and worst-case analysis. (Recall our previous examples: semi-random graph models and the random-order model for online algorithms.) In smoothed analysis, an adversary first picks an input, and nature subsequently adds a “small” perturbation to it. This ...



Journal title:

Volume   Issue 

Pages  -

Publication date: 2017